Cubic Regularization is the Key! The First Accelerated Quasi-Newton Method with a Global Convergence Rate of for Convex Functions
In this paper, we propose the first Quasi-Newton method with a global
convergence rate of for general convex functions. Quasi-Newton
methods, such as BFGS and SR-1, are well-known for their impressive practical
performance. However, they may be slower than gradient descent for general
convex functions, with the best theoretical rate of . This gap
between impressive practical performance and poor theoretical guarantees was a
long-standing open question. In this paper, we make a significant
step to close this gap. We improve upon the existing rate and propose the Cubic
Regularized Quasi-Newton Method with a convergence rate of . The key
to achieving this improvement is to use the Cubic Regularized Newton Method
over the Damped Newton Method as an outer method, where the Quasi-Newton update
is an inexact Hessian approximation. Using this approach, we propose the first
Accelerated Quasi-Newton method with a global convergence rate of
for general convex functions. In special cases where we can improve the
precision of the approximation, we achieve a global convergence rate of
, which is faster than any first-order method. To make these methods
practical, we introduce the Adaptive Inexact Cubic Regularized Newton Method
and its accelerated version, which provide real-time control of the
approximation error. We show that the proposed methods have impressive
practical performance and outperform both first- and second-order methods.
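The combination described above can be sketched as a cubic-regularized outer step whose Hessian is a quasi-Newton approximation. The following minimal illustration uses standard textbook components (a bisection subproblem solver and the classical BFGS update); it is our own sketch, not the authors' implementation:

```python
import numpy as np

def cubic_newton_step(g, B, M):
    """Minimize <g,h> + 0.5 h^T B h + (M/6)||h||^3 over h.

    Stationarity gives h = -(B + (M r / 2) I)^{-1} g with r = ||h||,
    so we bisect on the scalar r until ||h(r)|| = r.
    """
    n = len(g)
    step = lambda r: -np.linalg.solve(B + 0.5 * M * r * np.eye(n), g)
    lo, hi = 0.0, 1.0
    while np.linalg.norm(step(hi)) > hi:  # bracket the root
        hi *= 2.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if np.linalg.norm(step(mid)) > mid else (lo, mid)
    return step(hi)

def cubic_qn_minimize(grad, x0, M=10.0, iters=50):
    """Cubic-regularized outer loop with a BFGS inexact Hessian approximation."""
    x, B = x0.astype(float), np.eye(len(x0))
    for _ in range(iters):
        g = grad(x)
        s = cubic_newton_step(g, B, M)
        y = grad(x + s) - g
        if s @ y > 1e-12:  # curvature condition keeps B positive definite
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
        x = x + s
    return x
```

Because the quasi-Newton matrix enters only as an inexact Hessian inside the cubic step, the outer method retains its global guarantees while the BFGS update supplies curvature information cheaply.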
Inexact Model: A Framework for Optimization and Variational Inequalities
In this paper we propose a general algorithmic framework for first-order
methods in optimization in a broad sense, including minimization problems,
saddle-point problems, and variational inequalities. This framework recovers
many known methods as special cases, including the accelerated gradient method,
composite optimization methods, level-set methods, and proximal methods. The
idea of the framework is to construct an inexact model of the main problem
component, i.e. the objective function in optimization or the operator in
variational inequalities. Besides reproducing known results, our framework
allows us to construct new methods, which we illustrate by constructing a
universal method for variational inequalities with composite structure. This
method works for smooth and non-smooth problems with optimal complexity without
a priori knowledge of the problem smoothness. We also generalize our framework
for strongly convex objectives and strongly monotone variational inequalities.
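One standard way to formalize such an inexact model of the objective is the following (the notation here is schematic and may differ from the paper's):

```latex
% psi(., x) is convex with psi(x, x) = 0, and for all y the (delta, L)-model
% condition holds:
0 \le f(y) - f(x) - \psi(y, x) \le \frac{L}{2}\,\|y - x\|^{2} + \delta
% The gradient-type step built on the model:
x_{k+1} = \arg\min_{y} \Big\{ \psi(y, x_k) + \frac{L}{2}\,\|y - x_k\|^{2} \Big\}
% Taking psi(y, x) = <grad f(x), y - x> recovers the usual gradient step;
% other choices of psi recover composite and proximal methods.
```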
Advancing the lower bounds: An accelerated, stochastic, second-order method with optimal adaptation to inexactness
We present a new accelerated stochastic second-order method that is robust to
both gradient and Hessian inexactness, which typically occurs in machine
learning. We establish theoretical lower bounds and prove that our algorithm
achieves optimal convergence in both gradient and Hessian inexactness in this
key setting. We further introduce a tensor generalization for stochastic
higher-order derivatives. When the oracles are non-stochastic, the proposed
tensor algorithm matches the global convergence of Nesterov Accelerated Tensor
method. Both algorithms allow for approximate solutions of their auxiliary
subproblems with verifiable conditions on the accuracy of the solution.
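In the machine-learning setting the abstract refers to, gradient and Hessian inexactness typically arises from mini-batch subsampling of a finite-sum objective. A minimal illustration for the logistic loss (the function and its name are ours, not the paper's algorithm):

```python
import numpy as np

def subsampled_oracle(X, y, w, batch, rng):
    """Mini-batch gradient and Hessian of the logistic loss: both are
    stochastic, hence inexact, estimates of the full-sum derivatives."""
    idx = rng.choice(len(y), size=batch, replace=False)
    Xb, yb = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
    g = Xb.T @ (p - yb) / batch                # stochastic gradient
    H = (Xb.T * (p * (1.0 - p))) @ Xb / batch  # stochastic Hessian (PSD)
    return g, H
```

The batch size controls the inexactness level of both estimates, which is exactly the quantity the accelerated method must adapt to.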
The sol-gel synthesis of cotton/TiO2 composites and their antibacterial properties
The present work is devoted to the investigation of the structure and functional properties of hybrid nanomaterials based on TiO2-modified cellulose fibers of cotton. The titania hydrosol was successfully prepared via low-temperature sol–gel synthesis in aqueous medium, using titanium tetraisopropoxide as the precursor and nitric acid as the peptizing agent, and was applied to cotton fabric. For cross-linking the titania nanoparticles to cotton, 1,2,3,4-butanetetracarboxylic acid (BTCA) was used as a spacer. The morphology and composition of the surface of pure and TiO2-modified cotton fibers were investigated by scanning electron microscopy (SEM). The cotton/TiO2 composite was characterized by its dielectric permittivity. To estimate the total titania concentration, all samples were calcined at 650 °C. The antimicrobial activity of the treated TiO2 cotton fibers was investigated against Escherichia coli as a model Gram-negative bacterium after exposure to UV irradiation for 10 min.
Inexact relative smoothness and strong convexity for optimization and variational inequalities by inexact model
In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems, and variational inequalities. This framework recovers many known methods as special cases, including the accelerated gradient method, composite optimization methods, level-set methods, and Bregman proximal methods. The idea of the framework is to construct an inexact model of the main problem component, i.e. the objective function in optimization or the operator in variational inequalities. Besides reproducing known results, our framework allows us to construct new methods, which we illustrate by constructing a universal conditional gradient method and a universal method for variational inequalities with composite structure. These methods work for smooth and non-smooth problems with optimal complexity without a priori knowledge of the problem smoothness. As a particular case of our general framework, we introduce relative smoothness for operators and propose an algorithm for VIs with such operators. We also generalize our framework for relatively strongly convex objectives and strongly monotone variational inequalities.
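The Bregman proximal step underlying such methods admits a closed form on the probability simplex when the prox function is the entropy. A minimal sketch of this classical mirror-descent update (standard material, not specific to this paper):

```python
import numpy as np

def entropy_bregman_step(x, g, lr):
    """argmin_y { <g, y> + (1/lr) * KL(y || x) } over the probability simplex.
    The solution is the classical multiplicative-weights update."""
    y = x * np.exp(-lr * g)
    return y / y.sum()
```

Replacing the Euclidean distance with a Bregman divergence adapted to the feasible set is what lets such frameworks handle non-Euclidean geometries with the same analysis.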
Gradient methods for problems with inexact model of the objective
We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective, which includes as particular cases the inexact oracle [19] and the relative smoothness condition [43]. We analyze a gradient method which uses this inexact model and obtain convergence rates for convex and strongly convex problems. To show potential applications of our general framework, we consider three particular problems. The first is clustering by the electoral model introduced in [49]. The second is approximating the optimal transport distance, for which we propose a Proximal Sinkhorn algorithm. The third is approximating the optimal transport barycenter, for which we propose a Proximal Iterative Bregman Projections algorithm. We also illustrate the practical performance of our algorithms by numerical experiments.
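The plain Sinkhorn scaling that the Proximal Sinkhorn algorithm builds on alternately rescales an entropically regularized transport plan to match the two marginals. A minimal sketch of the classical iterations (not the authors' proximal variant):

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.5, iters=500):
    """Entropic optimal transport between histograms a and b with cost C.
    Returns the plan P = diag(u) K diag(v), where K = exp(-C / reg)."""
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)  # match column marginals
        u = a / (K @ v)    # match row marginals
    return u[:, None] * K * v[None, :]
```

Smaller values of `reg` approximate the unregularized transport distance better but slow the convergence of the scaling iterations, which is one motivation for proximal variants.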
Inexact tensor methods and their application to stochastic convex optimization
We propose a general non-accelerated tensor method under inexact information on higher-order derivatives, analyze its convergence rate, and provide sufficient conditions for this method to have complexity similar to that of the exact tensor method. As a corollary, we propose the first stochastic tensor method for convex optimization and obtain sufficient mini-batch sizes for each derivative.
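Schematically, a p-th order tensor step minimizes a regularized Taylor model, and inexactness means each derivative is replaced by an approximation (our notation, which may differ from the paper's):

```latex
% Regularized p-th order Taylor model at x:
\Omega_{p}(x; y) = f(x) + \sum_{i=1}^{p} \frac{1}{i!}\, D^{i} f(x)[y - x]^{i}
                 + \frac{M}{(p+1)!}\, \|y - x\|^{p+1}
% Inexactness: each derivative tensor D^{i} f(x) is replaced by an estimate
% G_{i} (e.g. a mini-batch average), with an assumed bound on
% \|G_{i} - D^{i} f(x)\| that drives the required batch sizes.
```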